Robust Bilinear Probabilistic Principal Component Analysis
Authors
Abstract
Principal component analysis (PCA) is one of the most popular tools in multivariate exploratory data analysis. Its probabilistic version (PPCA), based on a maximum likelihood procedure, provides a probabilistic way to perform dimension reduction. Recently, the bilinear PPCA (BPPCA) model, which assumes that the noise terms follow matrix variate Gaussian distributions, has been introduced to deal directly with two-dimensional (2-D) data, such as images, preserving their matrix structure and avoiding the curse of dimensionality. However, Gaussian distributions are not always appropriate in real-life applications, whose data sets may contain outliers. To make BPPCA robust to outliers, in this paper we propose a robust BPPCA model under the assumption of matrix variate t distributions for the noise terms. The alternating expectation conditional maximization (AECM) algorithm is used to estimate the model parameters. Numerical examples on several synthetic and publicly available data sets are presented to demonstrate the superiority of the proposed model in feature extraction, classification, and outlier detection.
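As a rough illustration of the model structure described in the abstract (the notation below is a simplified sketch with illustrative symbols, not the paper's exact formulation), a bilinear latent-variable model for a 2-D observation X_n with matrix variate t noise can be written as

\[
X_n = W_r\, Z_n\, W_c^{\top} + M + E_n,
\qquad
Z_n \sim \mathcal{MN}_{q_r \times q_c}\!\left(0,\, I_{q_r},\, I_{q_c}\right),
\qquad
E_n \sim t_{d_r \times d_c}\!\left(0,\, \sigma_r^2 I_{d_r},\, \sigma_c^2 I_{d_c},\, \nu\right),
\]

where W_r and W_c are row and column loading matrices and M is the mean matrix. The matrix variate t distribution with \nu degrees of freedom takes the place of the matrix variate Gaussian noise of standard BPPCA; its heavier tails down-weight outlying observations, and an AECM-type algorithm alternates expectation and conditional maximization steps over parameters such as (W_r, W_c, M, \sigma_r^2, \sigma_c^2, \nu).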
Similar Articles
Probabilistic Principal Component Analysis
Mixtures of Robust Probabilistic Principal Component Analyzers
Mixtures of probabilistic principal component analyzers model high-dimensional nonlinear data by combining local linear models. Each mixture component is specifically designed to extract the local principal orientations in the data. An important issue with this generative model is its sensitivity to data lying off the low-dimensional manifold. In order to address this problem, the mixtures of r...
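For context, the textbook mixture-of-PPCA density has the form below; the robust variants in this line of work typically replace each Gaussian component with a heavier-tailed Student-t component:

\[
p(x) = \sum_{k=1}^{K} \pi_k\, \mathcal{N}\!\left(x \mid \mu_k,\; W_k W_k^{\top} + \sigma_k^2 I\right),
\]

where component k has mixing weight \pi_k, loading matrix W_k, and isotropic noise variance \sigma_k^2.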
Sparse Probabilistic Principal Component Analysis
Principal component analysis (PCA) is a popular dimensionality reduction algorithm. However, it is not easy to interpret which of the original features are important based on the principal components. Recent methods improve interpretability by sparsifying PCA through adding an L1 regularizer. In this paper, we introduce a probabilistic formulation for sparse PCA. By presenting sparse PCA as a p...
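For comparison, one common deterministic formulation of sparse PCA (not the probabilistic formulation introduced in that paper) obtains a sparse leading component by adding an L1 penalty to the variance-maximization objective:

\[
\max_{w}\; w^{\top} \Sigma\, w \;-\; \lambda \lVert w \rVert_1
\quad \text{subject to}\quad \lVert w \rVert_2 \le 1,
\]

where \Sigma is the sample covariance matrix and \lambda controls how many entries of the loading vector w are driven to zero.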
Robust Kernel Principal Component Analysis
Kernel Principal Component Analysis (KPCA) is a popular generalization of linear PCA that allows non-linear feature extraction. In KPCA, data in the input space is mapped to higher (usually) dimensional feature space where the data can be linearly modeled. The feature space is typically induced implicitly by a kernel function, and linear PCA in the feature space is performed via the kernel tric...
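As a reference point, the standard kernel PCA computation (assuming N training points and a centered kernel matrix \tilde{K}) reduces to an eigenproblem in the coefficient vectors \alpha_m:

\[
\tilde{K}\,\alpha_m = N \lambda_m\, \alpha_m,
\qquad
\langle v_m, \phi(x)\rangle = \sum_{i=1}^{N} \alpha_{m,i}\,\tilde{k}(x_i, x),
\]

so principal directions in the feature space are never formed explicitly; projections require only kernel evaluations \tilde{k}(x_i, x).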
Robust Stochastic Principal Component Analysis
We consider the problem of finding lower dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for...
Journal
Journal title: Algorithms
Year: 2021
ISSN: 1999-4893
DOI: https://doi.org/10.3390/a14110322